A Quadratic Loss Multi-Class SVM for which a Radius-Margin Bound Applies

Authors

  • Yann Guermeur
  • Emmanuel Monfrini
Abstract

To set the values of the hyperparameters of a support vector machine (SVM), the method of choice is cross-validation. Several upper bounds on the leave-one-out error of the pattern recognition SVM have been derived. One of the most popular is the radius–margin bound. It applies to the hard margin machine, and, by extension, to the 2-norm SVM. In this article, we introduce the first quadratic loss multi-class SVM: the M-SVM². It can be seen as a direct extension of the 2-norm SVM to the multi-class case, which we establish by deriving the corresponding generalized radius–margin bound.
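The bound in question states that the number of leave-one-out errors of the hard margin machine is at most R²/γ², where R is the radius of the smallest sphere enclosing the training data in feature space and γ is the margin; the 2-norm SVM reduces to a hard margin machine on the modified kernel k̃(xᵢ, xⱼ) = k(xᵢ, xⱼ) + δᵢⱼ/C. Below is a minimal sketch of evaluating this bound in the binary case, assuming an RBF kernel and scikit-learn; the variable names and the centroid-based radius estimate are ours, not the paper's.

```python
# A minimal sketch, not the paper's code: evaluating the binary
# radius-margin bound with an RBF kernel and scikit-learn.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)
m = len(X)
K = rbf_kernel(X, X, gamma=0.5)

# The 2-norm SVM is a hard margin machine on the modified kernel
# k~(x_i, x_j) = k(x_i, x_j) + delta_ij / C.
C = 10.0
K_tilde = K + np.eye(m) / C

clf = SVC(kernel="precomputed", C=1e10)  # ~hard margin on k~
clf.fit(K_tilde, y)

# Squared inverse margin: 1/gamma^2 = sum_ij alpha_i alpha_j y_i y_j k~_ij.
sv = clf.support_
coef = clf.dual_coef_.ravel()  # y_i * alpha_i on the support vectors
inv_gamma_sq = coef @ K_tilde[np.ix_(sv, sv)] @ coef

# The ball centred on the kernel-space mean encloses the data, so its
# radius upper-bounds the smallest enclosing sphere radius R, and the
# radius-margin bound remains valid with this overestimate.
R_sq = np.max(np.diag(K_tilde) - 2 * K_tilde.mean(axis=1) + K_tilde.mean())

print(f"bound on the LOO error rate: {R_sq * inv_gamma_sq / m:.3f}")
```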


Similar Articles

A Quadratic Loss Multi-Class SVM

Using a support vector machine requires setting two types of hyperparameters: the soft margin parameter C and the parameters of the kernel. To perform this model selection task, the method of choice is cross-validation. Its leave-one-out variant is known to produce an almost unbiased estimator of the generalization error. Its major drawback lies in its time requirement. To overcome this...
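For concreteness, here is a minimal sketch of the leave-one-out model selection this describes, assuming scikit-learn; the data set and the grid of (C, gamma) values are illustrative only.

```python
# A minimal sketch of leave-one-out selection of (C, gamma) for an SVM.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
loo = LeaveOneOut()

best = None
for C in [0.1, 1.0, 10.0]:
    for gamma in [0.01, 0.1, 1.0]:
        # m fits per candidate: this is the time requirement the
        # abstract refers to, and what the LOO upper bounds avoid.
        acc = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=loo).mean()
        if best is None or acc > best[0]:
            best = (acc, C, gamma)

print(f"LOO accuracy {best[0]:.3f} at C={best[1]}, gamma={best[2]}")
```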


Convex formulations of radius-margin based Support Vector Machines

We consider Support Vector Machines (SVMs) learned together with linear transformations of the feature spaces on which they are applied. Under this scenario, the radius of the smallest data-enclosing sphere is no longer fixed. Therefore, optimizing the SVM error bound by considering both the radius and the margin has the potential to deliver a tighter error bound. In this paper, we present two novel...
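The radius of the smallest enclosing sphere is itself the solution of a quadratic program: maximize Σᵢ βᵢ k(xᵢ, xᵢ) − Σᵢⱼ βᵢβⱼ k(xᵢ, xⱼ) subject to Σᵢ βᵢ = 1, βᵢ ≥ 0, whose optimum equals R². A minimal sketch of computing it in feature space via this standard dual, assuming an RBF kernel and SciPy (names illustrative):

```python
# A minimal sketch: smallest enclosing sphere radius in feature space
# via the standard QP dual, solved with a general-purpose optimizer.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_blobs
from sklearn.metrics.pairwise import rbf_kernel

X, _ = make_blobs(n_samples=60, centers=2, random_state=0)
K = rbf_kernel(X, X, gamma=0.5)
m = len(X)

# Dual: maximize sum_i beta_i K_ii - beta' K beta
#       s.t. sum_i beta_i = 1, beta_i >= 0; the optimum equals R^2.
def neg_dual(beta):
    return -(beta @ np.diag(K) - beta @ K @ beta)

res = minimize(
    neg_dual,
    x0=np.full(m, 1.0 / m),
    bounds=[(0.0, None)] * m,
    constraints=[{"type": "eq", "fun": lambda b: b.sum() - 1.0}],
    method="SLSQP",
)
R_sq = -res.fun
print(f"R^2 = {R_sq:.4f}")
```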


Radius-Margin Bound on the Leave-One-Out Error of the LLW-M-SVM

To set the values of the hyperparameters of a support vector machine (SVM), one can use cross-validation. Its leave-one-out variant produces an almost unbiased estimator of the generalization error. Its major drawback lies in its time requirement. To overcome this difficulty, several upper bounds on the leave-one-out error of the pattern recognition SVM have been derived. The most popular...


Radius-Margin Bound on the Leave-One-Out Error of an M-SVM

Using a support vector machine (SVM) requires setting the values of two types of hyperparameters: the soft margin parameter C and the parameters of the kernel. To perform this model selection task, the method of choice is cross-validation. Its leave-one-out variant is known to produce an almost unbiased estimator of the generalization error. Its major drawback lies in its time requirement...


Learning Kernel Parameters by using Class Separability Measure

Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization abilities of these methods. Besides cross-validation and leave-one-out estimation, minimizing an upper bound on the generalization error, such as the radius-margin bound, has also been proposed as a more efficient way to learn the optimal kernel parameters. In this...
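A minimal sketch of the bound-minimization idea this builds on: select the RBF width by searching over the radius-margin objective R²·‖w‖². The helper name and the grid are ours, and the one-dimensional grid search stands in for the gradient descent such methods actually use.

```python
# A minimal sketch: pick the RBF kernel width by minimizing the
# radius-margin objective R^2 * ||w||^2 (illustrative only).
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)

def radius_margin(gamma):
    K = rbf_kernel(X, X, gamma=gamma)
    clf = SVC(kernel="precomputed", C=1e10).fit(K, y)  # ~hard margin
    sv, coef = clf.support_, clf.dual_coef_.ravel()
    w_sq = coef @ K[np.ix_(sv, sv)] @ coef             # 1 / margin^2
    # Centroid-based overestimate of the enclosing sphere radius R.
    R_sq = np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean())
    return R_sq * w_sq

grid = np.logspace(-3, 1, 20)
best = min(grid, key=radius_margin)
print(f"gamma minimizing the bound: {best:.4f}")
```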



Journal:
  • Informatica, Lith. Acad. Sci.

Volume: 22  Issue: 

Pages: -

Publication date: 2011